
    Generating Functions For Kernels of Digraphs (Enumeration & Asymptotics for Nim Games)

    In this article, we study directed graphs (digraphs) with a coloring constraint due to Von Neumann and related to Nim-type games. This is equivalent to the notion of kernels of digraphs, which appears in numerous fields of research such as game theory, complexity theory, artificial intelligence (default logic, argumentation in multi-agent systems), 0-1 laws in monadic second-order logic, combinatorics (perfect graphs)... Kernels of digraphs lead to numerous difficult questions (in the sense of NP-completeness and #P-completeness). However, we show here that it is possible to use a generating function approach to obtain new information: we use techniques of symbolic and analytic combinatorics (generating functions and their singularities) in order to get exact and asymptotic results, e.g. for the existence of a kernel in a circuit or in a unicircuit digraph. This is a first step toward a generatingfunctionology treatment of kernels, using, e.g., an approach "à la Wright". Our method could be applied to more general "local coloring constraints" in decomposable combinatorial structures.
    Comment: Presented (as a poster) at the conference Formal Power Series and Algebraic Combinatorics (Vancouver, 2004), electronic proceedings.
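    To make the kernel notion above concrete (this sketch is not from the article, which works with generating functions rather than enumeration), the following Python snippet checks the two defining properties of a kernel, independence and absorption, by brute force on a directed cycle (circuit). Consistent with the circuit case mentioned in the abstract, it finds kernels exactly when the cycle length is even. The function names and the itertools-based enumeration are choices made for this illustration only.

```python
from itertools import combinations

def is_kernel(vertices, edges, subset):
    """A kernel is an independent set S such that every vertex
    outside S has at least one successor inside S."""
    s = set(subset)
    # Independence: no arc joins two vertices of S.
    if any(u in s and v in s for (u, v) in edges):
        return False
    # Absorption: every vertex outside S points into S.
    return all(any((u, v) in edges and v in s for v in vertices)
               for u in vertices if u not in s)

def kernels_of_circuit(n):
    """Enumerate all kernels of the directed cycle 0 -> 1 -> ... -> n-1 -> 0."""
    vertices = list(range(n))
    edges = {(i, (i + 1) % n) for i in vertices}
    return [subset
            for k in range(n + 1)
            for subset in combinations(vertices, k)
            if is_kernel(vertices, edges, subset)]

if __name__ == "__main__":
    for n in range(2, 8):
        # Odd circuits admit no kernel; even circuits admit exactly two
        # (the two alternating vertex sets).
        print(n, kernels_of_circuit(n))
```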

    Image Watermarking With Biometric Data For Copyright Protection

    In this paper, we deal with the proof of ownership or legitimate usage of a digital content, such as an image, in order to tackle illegitimate copying. The proposed scheme, based on the combination of watermarking and cancelable biometrics, does not require a trusted third party: all exchanges take place between the provider and the customer. The use of cancelable biometrics makes it possible to provide a privacy-compliant proof of identity. We illustrate the robustness of this method against intentional and unintentional attacks on the watermarked content.
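    The abstract does not spell out the embedding procedure, so the following is only a minimal, generic sketch and not the authors' scheme: it derives a revocable, hash-based stand-in for a cancelable biometric template and hides it in the least significant bits of a grayscale image. The cancelable_template and embed_lsb helpers, the SHA-256 keyed hash, and the use of numpy are all assumptions made for this illustration; a realistic scheme would use a robust embedding domain and a genuine cancelable-biometrics transform.

```python
import hashlib
import numpy as np

def cancelable_template(biometric_bytes: bytes, user_token: bytes, n_bits: int = 256) -> np.ndarray:
    """Toy stand-in for a cancelable transform: a keyed hash of the biometric
    data; revoking the token yields a new, unlinkable template."""
    digest = hashlib.sha256(user_token + biometric_bytes).digest()
    bits = np.unpackbits(np.frombuffer(digest, dtype=np.uint8))
    return bits[:n_bits]

def embed_lsb(image: np.ndarray, bits: np.ndarray) -> np.ndarray:
    """Embed the bit string into the least significant bits of the first
    len(bits) pixels of a grayscale uint8 image (fragile watermark)."""
    flat = image.flatten()
    flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | bits
    return flat.reshape(image.shape)

def extract_lsb(image: np.ndarray, n_bits: int) -> np.ndarray:
    """Read back the embedded bits."""
    return image.flatten()[:n_bits] & 1

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    image = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
    template = cancelable_template(b"fake-minutiae-data", b"user-token")
    marked = embed_lsb(image, template)
    assert np.array_equal(extract_lsb(marked, len(template)), template)
    print("template recovered from watermarked image")
```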

    Assessment of Translocator Protein Density, as Marker of Neuroinflammation, in Major Depressive Disorder: A Pilot, Multicenter, Comparative, Controlled, Brain PET Study (INFLADEP Study)

    Background: Major depressive disorder (MDD) is a serious public health problem with a high lifetime prevalence (4.4–20%) in the general population. The monoamine hypothesis is the most widespread etiological theory of MDD. In addition, recent scientific data have emphasized the importance of immuno-inflammatory pathways in the pathophysiology of MDD. The lack of data on the magnitude of brain neuroinflammation in MDD is the main limitation of this inflammatory hypothesis. Our team has previously demonstrated the relevance of [18F] DPA-714 as a neuroinflammation biomarker in humans. We formulated the following hypotheses for the current study: (i) neuroinflammation in MDD can be measured by [18F] DPA-714; (ii) its levels are associated with clinical severity; (iii) it is accompanied by anatomical and functional alterations within the frontal-subcortical circuits; and (iv) it is a marker of treatment resistance.
    Methods: Depressed patients will be recruited at 4 centers (Bordeaux, Montpellier, Tours, and Toulouse) of the French network of 13 expert centers for resistant depression. The patient population will be divided into 3 groups: (i) an experimental group of patients with current MDD (n = 20); (ii) a remitted depressed group of patients in remission but still being treated (n = 20); and (iii) a control group without any history of MDD (n = 20). The primary objective will be to compare PET data (i.e., the distribution pattern of neuroinflammation) between the currently depressed group and the control group. Secondary objectives will be to: (i) compare neuroinflammation across groups (currently depressed group vs. remitted depressed group vs. control group); (ii) correlate neuroinflammation with clinical severity across groups; (iii) correlate neuroinflammation with MRI parameters for structural and functional integrity across groups; and (iv) correlate neuroinflammation with peripheral markers of inflammation across groups.
    Discussion: This study will assess the effects of antidepressants on neuroinflammation as well as its role in the treatment response. It will contribute to clarifying the putative relationships between neuroinflammation quantified by brain neuroimaging techniques and peripheral markers of inflammation. Lastly, it is expected to open innovative and promising therapeutic perspectives based on anti-inflammatory strategies for the management of the treatment-resistant forms of MDD commonly seen in clinical practice.
    Clinical trial registration (reference: NCT03314155): https://www.clinicaltrials.gov/ct2/show/NCT03314155?term=neuroinflammation&cond=depression&cntry=FR&rank=

    Rheumatoid arthritis - treatment: 180. Utility of Body Weight Classified Low-Dose Leflunomide in Japanese Rheumatoid Arthritis

    Background: In Japan, more than 20 deaths of rheumatoid arthritis (RA) patients from interstitial pneumonia (IP) attributed to leflunomide (LEF) have been reported, although many of these cases are now considered to have been opportunistic infections. In this paper, the efficacy and safety of low-dose LEF dosed according to body weight (BW) were studied.
    Methods: Fifty-nine RA patients started LEF between July 2007 and July 2009. Among them, 25 patients were excluded because of combination with tacrolimus or medication changes within 3 months before starting LEF. The remaining 34 RA patients, who received 20 to 50 mg/week of LEF, were followed up for 1 year and enrolled in this study. The LEF dose was classified by BW (50 mg/week for patients over 50 kg, 40 mg/week for 40 to 50 kg, and 20 to 30 mg/week for under 40 kg). The average age and RA duration of the enrolled patients were 55.5 years and 10.2 years, respectively. Prednisolone (PSL), methotrexate (MTX), and etanercept were used in 23, 28, and 2 patients, respectively. In cases of insufficient response or adverse effects, a dosage change or discontinuation of LEF was considered. Failure was defined as a dosage increase of PSL or MTX, or a dosage decrease or discontinuation of LEF. The last-observation-carried-forward method was used to evaluate failed patients at 1 year.
    Results: At 1 year after the start of LEF, good/moderate/no response, assessed by the European League Against Rheumatism (EULAR) response criteria using the Disease Activity Score including a 28-joint count (DAS28)-C-reactive protein (CRP), was observed in 14/10/10 patients, respectively. The LEF dosage changes at 1 year were: dosage increased in 10, unchanged in 5, decreased in 8, and discontinued in 11 patients. The failure-free survival rate in this study was 23.5% (24 patients failed), but the actual LEF continuation rate was 67.6% (11 patients discontinued) at 1 year. The major reason for failure was liver dysfunction; Pneumocystis pneumonia occurred in 1 patient, who fully recovered. One patient died of sepsis caused by a decubitus ulcer infection. The DAS28-CRP score decreased significantly, from 3.9 to 2.7. CRP decreased from 1.50 to 0.93 mg/dl, but this was not significant. Matrix metalloproteinase (MMP)-3 decreased significantly, from 220.0 to 174.2 ng/ml. Glutamate pyruvate transaminase (GPT) increased significantly, from 19 to 35 U/l, and the leukocyte count decreased significantly, from 7832 to 6271. DAS28-CRP, CRP, and MMP-3 improved significantly in patients on MTX, but not in those without MTX. A significant increase in GPT and significant leukopenia were seen with MTX, but not without MTX.
    Conclusions: The reported risk factors for LEF-induced IP in Japanese RA patients are a history of IP, loading-dose administration, and low BW. The addition of low-dose LEF is a potent and safe alternative for patients showing an unsatisfactory response to current medicines, but attention must be paid to liver function and to infections caused by leukopenia, especially in combination with MTX.
    Disclosure statement: The authors have declared no conflicts of interest.

    Equivalence classes of Boolean functions for first-order correlation

    This paper presents a complete characterization of the first-order correlation-immune Boolean functions, which includes the functions that are 1-resilient. The approach consists in defining an equivalence relation on the full set of Boolean functions with a fixed number of variables. An equivalence class in this relation, called a first-order correlation class, provides a measure of the distance between the Boolean functions it contains and the correlation-immune Boolean functions. The key idea consists in manipulating only the equivalence classes instead of the set of Boolean functions. To achieve this goal, a class operator is introduced to construct a class with n variables from two classes with n - 1 variables. In particular, the class of 1-resilient functions on n variables is considered. An original and efficient method to enumerate all the Boolean functions in this class is proposed, based on a recursive decomposition into classes with fewer variables. A bottom-up algorithm provides the exact number of 1-resilient Boolean functions with seven variables, which is 23478015754788854439497622689296. A tight estimate of the number of 1-resilient functions with eight variables is obtained by performing a partial enumeration. It is conjectured that the exact complete enumeration for general n is intractable.
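    As background for the definitions above (this is not the paper's class-based method, which deliberately avoids enumerating individual functions), the sketch below tests whether a Boolean function given by its truth table is 1-resilient: it must be balanced and its Walsh transform must vanish on every input of Hamming weight one. The brute-force computation is only meant for small numbers of variables, and the function names are assumptions made for this sketch.

```python
from itertools import product

def walsh(truth_table, n, a):
    """Walsh coefficient W_f(a) = sum over x of (-1)^(f(x) XOR a.x)."""
    total = 0
    for i, x in enumerate(product((0, 1), repeat=n)):
        dot = sum(ai * xi for ai, xi in zip(a, x)) % 2
        total += (-1) ** (truth_table[i] ^ dot)
    return total

def is_1_resilient(truth_table, n):
    """f is 1-resilient iff it is balanced (W_f(0) = 0) and
    correlation immune of order 1 (W_f(a) = 0 for all a of weight 1)."""
    zero = tuple(0 for _ in range(n))
    if walsh(truth_table, n, zero) != 0:
        return False
    for i in range(n):
        a = tuple(1 if j == i else 0 for j in range(n))
        if walsh(truth_table, n, a) != 0:
            return False
    return True

if __name__ == "__main__":
    # Example: f(x1, x2, x3) = x1 XOR x2 XOR x3 is balanced and 1-resilient.
    n = 3
    tt = [x1 ^ x2 ^ x3 for x1, x2, x3 in product((0, 1), repeat=n)]
    print(is_1_resilient(tt, n))  # True
```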

    Some Studies of Randomness in Computer Science (Quelques études de l'aléatoire en informatique)

    The aim of this presentation is to show the central role of randomness in research areas such as combinatorics, algorithmics, and complexity. We will also see that it arises in application domains such as cryptography, watermarking, and biometrics. Depending on the studies considered, different aspects of randomness come into play, such as random modeling and the computation of asymptotic probabilities, random generation, or the extraction of randomness from data.
    A first part will be devoted to the study of kernels in random graphs. A kernel is a set of vertices satisfying two well-known properties in graph theory, independence and domination; these two properties are in tension and limit the possible sizes of kernels. We will see how to modify these sizes either by proposing variants of the kernel property (counterexamples to 0-1 laws in finite model theory and in modal logics, phase transition results), or by changing the probability distribution (circuit-free graphs, sparse graphs, dense graphs). We will see what impact this has on the complexity of kernel-search algorithms.
    The second part deals with classes of Boolean functions defined by properties useful for symmetric cryptography. The goal is the enumeration and uniform random generation of these classes of functions. We will see that it is possible to efficiently enumerate and generate the 1-resilient functions up to 8 variables. The originality of our method, both combinatorial and algorithmic, which we have called the class method, is to classify the set of Boolean functions according to their distance from the 1-resilient functions.
    In the third part we turn to the study of the randomness of data; this work is part of co-supervised PhD theses. The idea is to avoid a difficult and unfaithful random modeling and instead to determine the random parts of the data. The thesis of Cyril Bazin (defended in 2010) proposes a watermarking method for vector geographic data that is fast, blind, and robust to rotation and translation. The method consists in introducing a statistical bias on small parts of the document called sites. The theses of Zhigang Yao (defended in 2015) and Benoît Vibert (in progress) deal with biometric data, namely fingerprints, and more precisely with the minutiae extracted from these fingerprints. We will see how to measure the quality of a fingerprint and how to select the most relevant minutiae.
    Finally, we will propose a research project on quantifying and classifying the randomness of data coming from digital transactions.